Stein Shrinkage with Penalization and Second Order Efficiency in Semiparametrics

Authors

  • Arnak S. Dalalyan

Abstract

Abstract: The problem of estimating the centre of symmetry of an unknown periodic function observed in Gaussian white noise is considered. Using the penalized blockwise Stein method, a smoothing filter that allows defining the penalized profile likelihood is proposed. The estimator of the centre of symmetry is then the maximizer of this penalized profile likelihood. This estimator is shown to be semiparametrically adaptive and efficient. Moreover, the second-order term of its risk expansion is proved to behave at least as well as the second-order term for the best possible estimator using a monotone smoothing filter. Under mild assumptions, this estimator is shown to be second-order minimax sharp adaptive over the whole scale of Sobolev balls with smoothness β > 1. These results thus improve on Dalalyan, Golubev and Tsybakov (2006), where β ≥ 2 is required.
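As a rough illustration of the penalized blockwise Stein idea, the sketch below shrinks each block of noisy coefficients by the factor (1 − penalty·σ²·|B|/‖y_B‖²)₊, so blocks dominated by noise are zeroed while signal-bearing blocks are nearly kept. The function name, block layout, and penalty constant are illustrative assumptions, not the paper's exact filter, which uses more refined blocks and penalties.

```python
import numpy as np

def blockwise_stein(y, noise_var, block_size=4, penalty=2.0):
    """Shrink each block of noisy coefficients by (1 - penalty*sigma^2*|B|/||y_B||^2)_+.

    A rough sketch of penalized blockwise Stein shrinkage, not the
    paper's exact smoothing filter.
    """
    y = np.asarray(y, dtype=float)
    out = np.zeros_like(y)
    for start in range(0, len(y), block_size):
        block = y[start:start + block_size]
        energy = float(np.sum(block ** 2))
        if energy > 0.0:
            shrink = max(0.0, 1.0 - penalty * noise_var * block.size / energy)
            out[start:start + block.size] = shrink * block
    return out

# blocks dominated by signal are kept (mildly shrunk); near-noise blocks are zeroed
y = np.array([4.0, 4.0, 4.0, 4.0, 0.1, -0.1, 0.1, -0.1])
print(blockwise_stein(y, noise_var=1.0))
# first block: factor 1 - 2*1*4/64 = 0.875 -> 3.5 each; second block -> 0
```

On this toy input, the strong first block is shrunk only mildly while the noise-level second block is suppressed entirely, which is the qualitative behaviour the blockwise rule is designed to produce.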


Similar articles

James-Stein Shrinkage and Second Order Efficiency in Semiparametrics

Abstract: The problem of estimating the centre of symmetry of an unknown periodic function observed in Gaussian white noise is considered. Using the penalized blockwise James-Stein method, a smoothing filter that allows defining the penalized profile likelihood is proposed. The estimator of the centre of symmetry is then the maximizer of this penalized profile likelihood. This estimator is shown ...

Full text

Stein Shrinkage and Second-Order Efficiency for semiparametric estimation of the shift

The problem of estimating the shift (or, equivalently, the center of symmetry) of an unknown symmetric and periodic function f observed in Gaussian white noise is considered. Using the blockwise Stein method, a penalized profile likelihood with a data-driven penalization is introduced so that the estimator of the center of symmetry is defined as the maximizer of the penalized profile likelihood...

Full text

Shrinkage and Penalized Likelihood as Methods to Improve Predictive Accuracy

Hans C. van Houwelingen and Saskia le Cessie, Department of Medical Statistics, Leiden, The Netherlands (P.O. Box 9604, 2300 RC Leiden; email: [email protected]). Abstract: A review is given of shrinkage and penalization as tools to improve the predictive accuracy of regression models. The James-Stein estimator is taken as the starting point. Procedures covered are pre-test estimation, ...

Full text
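The James-Stein estimator that the review above takes as its starting point has a compact closed form: for X ~ N(θ, σ²I_p) with p ≥ 3, shrink X toward the origin by the factor 1 − (p−2)σ²/‖X‖². The sketch below is the standard textbook positive-part variant, not the review's own procedure; the function name is illustrative.

```python
import numpy as np

def james_stein(x, sigma2=1.0):
    """Positive-part James-Stein estimate of the mean of a N(theta, sigma2*I_p) vector.

    Shrinks the raw observation toward the origin by 1 - (p-2)*sigma2/||x||^2,
    clipped at zero. Dominates the raw observation in squared-error risk when p >= 3.
    """
    x = np.asarray(x, dtype=float)
    p = x.size
    if p < 3:
        return x.copy()  # no uniform improvement over x itself when p < 3
    factor = 1.0 - (p - 2) * sigma2 / float(np.sum(x ** 2))
    return max(factor, 0.0) * x

x = np.array([3.0, 0.0, 4.0])  # ||x||^2 = 25, p = 3
print(james_stein(x))          # factor = 1 - 1/25 = 0.96 -> [2.88, 0.0, 3.84]
```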

Semiparametrics, Nonparametrics and Empirical Bayes Procedures in Linear Models

In a classical parametric setup, a key factor in the implementation of the Empirical Bayes methodology is the incorporation of a suitable prior that is compatible with the parametric setup and yet lends to the estimation of the Bayes (shrinkage) factor in an empirical manner. The situation is more complex in semi-parametric and (even more in) nonparametric models. Although the Dirichlet prior...

Full text

Least Absolute Shrinkage is Equivalent to Quadratic Penalization

Adaptive ridge is a special form of ridge regression, balancing the quadratic penalization on each parameter of the model. This paper shows the equivalence between adaptive ridge and lasso (least absolute shrinkage and selection operator). This equivalence states that both procedures produce the same estimate. Least absolute shrinkage can thus be viewed as a particular quadratic penalization. F...

Full text
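The lasso/adaptive-ridge equivalence in the abstract above can be illustrated in the orthonormal-design case, where the lasso solution is soft thresholding. Using the variational identity |b| = min_{e>0} (b²/(2e) + e/2), the ℓ₁ penalty becomes a quadratic penalty with adaptive per-coordinate weights, and alternating ridge fits with weight updates converges to the lasso estimate. This is an illustrative sketch under that simplifying assumption, not the paper's general algorithm.

```python
import numpy as np

def lasso_via_adaptive_ridge(z, lam, n_iter=200, eps=1e-12):
    """Solve the orthonormal-design lasso  min_b 0.5*||z - b||^2 + lam*||b||_1
    by iterating quadratic (ridge) problems with adaptive per-coordinate weights.

    With weights e_j fixed, the quadratic problem gives b_j = z_j / (1 + lam/e_j);
    the weights are then updated as e_j = |b_j|. The fixed point of this
    alternation is soft thresholding, i.e. the lasso solution.
    """
    z = np.asarray(z, dtype=float)
    b = z.copy()
    for _ in range(n_iter):
        e = np.abs(b) + eps  # eps guards the division for coordinates driven to zero
        b = z / (1.0 + lam / e)
    return b

z = np.array([3.0, -0.5, 1.2])
print(lasso_via_adaptive_ridge(z, lam=1.0))
# converges to soft thresholding of z at lam: approximately [2.0, 0.0, 0.2]
```

The middle coordinate, whose magnitude is below the threshold lam, is driven to zero by the reweighting, which is how a purely quadratic penalty reproduces the lasso's variable selection.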



Journal:

Volume   Issue 

Pages  -

Publication year: 2005